
    A Machine Learning-based Framework for Predictive Maintenance of Semiconductor Laser for Optical Communication

    Semiconductor lasers, one of the key components of optical communication systems, have been rapidly evolving to meet the requirements of next-generation optical networks with respect to high speed, low power consumption, small form factor, etc. However, these demands have brought severe challenges to semiconductor laser reliability, and a great deal of attention has therefore been devoted to improving it and thereby ensuring reliable transmission. In this paper, a predictive maintenance framework using machine learning techniques is proposed for real-time health monitoring and prognosis of semiconductor lasers, thus enhancing their reliability. The proposed approach is composed of three stages: i) real-time performance degradation prediction, ii) degradation detection, and iii) remaining useful life (RUL) prediction. First, an attention-based gated recurrent unit (GRU) model is adopted for real-time prediction of performance degradation. Then, a convolutional autoencoder is used to detect degradation or abnormal behavior of a laser, given the predicted degradation performance values. Once an abnormal state is detected, an attention-based deep learning RUL prediction model is applied, and the estimated RUL serves as input for decision making and maintenance planning. The proposed framework is validated using experimental data derived from accelerated aging tests conducted on semiconductor tunable lasers. The proposed approach achieves very good degradation prediction capability with a small root mean square error (RMSE) of 0.01, a good anomaly detection accuracy of 94.24%, and better RUL estimation capability than existing ML-based laser RUL prediction models.
    Comment: Published in Journal of Lightwave Technology (Volume 40, Issue 14, 15 July 2022)
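    As an illustration of the three-stage pipeline sketched in this abstract, the following minimal PyTorch example (an assumption, not the authors' implementation) pairs an attention-based GRU predictor with a convolutional autoencoder whose reconstruction error flags degradation; layer sizes, sequence length, and the anomaly threshold are hypothetical.

```python
# Minimal sketch of the described pipeline; all dimensions and the anomaly
# threshold are illustrative assumptions, not the paper's settings.
import torch
import torch.nn as nn

class AttentionGRU(nn.Module):
    """Stage 1: attention-based GRU predicting the next degradation value."""
    def __init__(self, n_features=1, hidden=64):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)           # scores each time step
        self.out = nn.Linear(hidden, n_features)

    def forward(self, x):                          # x: (batch, time, features)
        h, _ = self.gru(x)
        w = torch.softmax(self.attn(h), dim=1)     # attention weights over time
        context = (w * h).sum(dim=1)               # weighted sum of hidden states
        return self.out(context)                   # predicted degradation value

class ConvAutoencoder(nn.Module):
    """Stage 2: high reconstruction error flags abnormal behavior."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv1d(1, 8, 3, padding=1), nn.ReLU(),
                                 nn.Conv1d(8, 4, 3, padding=1), nn.ReLU())
        self.dec = nn.Sequential(nn.Conv1d(4, 8, 3, padding=1), nn.ReLU(),
                                 nn.Conv1d(8, 1, 3, padding=1))

    def forward(self, x):                          # x: (batch, 1, seq_len)
        return self.dec(self.enc(x))

# Stage 3 (attention-based RUL regression on the post-anomaly sequence) would
# follow the same pattern and is omitted for brevity.
seq = torch.randn(16, 32, 1)                       # toy batch of monitored values
next_value = AttentionGRU()(seq)
x = seq.transpose(1, 2)
recon_err = ((ConvAutoencoder()(x) - x) ** 2).mean(dim=(1, 2))
degraded = recon_err > recon_err.mean() + 2 * recon_err.std()   # toy threshold
```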

    Fault Monitoring in Passive Optical Networks using Machine Learning Techniques

    Passive optical network (PON) systems are vulnerable to a variety of failures, including fiber cuts and optical network unit (ONU) transmitter/receiver failures. Any service interruption caused by a fiber cut can result in huge financial losses for service providers and operators. Identifying the faulty ONU becomes difficult in the case of nearly equidistant branch terminations, because the reflections from the branches overlap and the faulty branch can no longer be distinguished in the global backscattered signal. With increasing network size, the complexity of fault monitoring in PON systems grows, resulting in less reliable monitoring. To address these challenges, we propose in this paper various machine learning (ML) approaches for fault monitoring in PON systems and validate them using experimental optical time domain reflectometry (OTDR) data.
    Comment: ICTON 202
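    The abstract frames fault monitoring as learning from OTDR traces; the sketch below (a hypothetical setup, not the authors' experiment) shows one way to cast faulty-branch identification as multiclass classification with scikit-learn, using random arrays as stand-ins for the backscatter traces and fault labels.

```python
# Hypothetical framing of PON fault localization as OTDR-trace classification.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_traces, trace_len, n_branches = 2000, 512, 8     # assumed sizes
X = rng.normal(size=(n_traces, trace_len))         # stand-in for OTDR backscatter traces
y = rng.integers(0, n_branches, size=n_traces)     # stand-in for faulty-branch labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=300, random_state=0)
clf.fit(X_tr, y_tr)
print("faulty-branch identification accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```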

    Degradation Prediction of Semiconductor Lasers using Conditional Variational Autoencoder

    Semiconductor lasers have been rapidly evolving to meet the demands of next-generation optical networks. This imposes much more stringent requirements on laser reliability, which is dominated by degradation mechanisms (e.g., sudden degradation) limiting the semiconductor laser lifetime. Physics-based approaches are often used to characterize the degradation behavior analytically, yet they require explicit domain knowledge and accurate mathematical models. Building such models can be very challenging due to the lack of a full understanding of the complex physical processes inducing the degradation under various operating conditions. To overcome these limitations, we propose a new data-driven approach that extracts useful insights from operational monitoring data to predict the degradation trend without requiring specific domain knowledge or a physical model. The proposed approach is based on an unsupervised technique, a conditional variational autoencoder, and is validated using vertical-cavity surface-emitting laser (VCSEL) and tunable edge-emitting laser reliability data. The experimental results confirm that our model (i) achieves good degradation prediction and generalization performance, yielding an F1 score of 95.3%, (ii) outperforms several baseline ML-based anomaly detection techniques, and (iii) helps to shorten the aging tests by predicting failing devices early, before the end of the test, thereby saving costs.
    Comment: Published in Journal of Lightwave Technology (Volume 40, Issue 18, 15 September 2022)
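    To make the conditional variational autoencoder idea concrete, here is a compact PyTorch sketch with assumed dimensions; the condition vector stands in for operating parameters such as bias current and temperature. A device whose reconstruction error exceeds a validation-derived threshold would be flagged as degrading, mirroring the anomaly-detection use described above.

```python
# Hedged sketch of a conditional VAE; sizes and conditioning are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CVAE(nn.Module):
    def __init__(self, x_dim=32, c_dim=2, z_dim=8, h_dim=64):
        super().__init__()
        self.enc = nn.Linear(x_dim + c_dim, h_dim)
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec1 = nn.Linear(z_dim + c_dim, h_dim)
        self.dec2 = nn.Linear(h_dim, x_dim)

    def forward(self, x, c):                       # x: monitored window, c: conditions
        h = F.relu(self.enc(torch.cat([x, c], dim=1)))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)   # reparameterization
        x_hat = self.dec2(F.relu(self.dec1(torch.cat([z, c], dim=1))))
        return x_hat, mu, logvar

def cvae_loss(x, x_hat, mu, logvar):
    recon = F.mse_loss(x_hat, x)                                     # reconstruction term
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())   # KL divergence term
    return recon + kld

model = CVAE()
x, c = torch.randn(8, 32), torch.randn(8, 2)       # toy batch
loss = cvae_loss(x, *model(x, c))
```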

    Impact of Liquid Crystal Based Interference Mitigation and Precoding on the Multiuser Performance of VLC Massive MIMO Arrays

    In visible light communication systems, the ability to suppress interference caused by other light sources is a major lever for performance improvement. Especially for large transmitter arrays or even multi-cell arrangements, the interference problem needs to be handled. In previous work, we presented a liquid crystal display (LCD) used as an adaptive interference-suppression filter mounted in front of each photodetector. The display elements are switched on and off such that light emitted by unwanted sources is ideally blocked, while light emitted by desired sources reaches the detector. The pattern generated by the LC display has a strong impact on system performance. In this paper, we propose precoding combined with LCD-based interference suppression in order to increase the signal-to-interference-plus-noise ratio and to ensure user fairness in massive MIMO scenarios. The suggested precoding strategy uses a new heuristic optimization approach based on the Santa Claus problem on unrelated machines known from computer science, and employs only binary entries in the weighting matrix. The results are compared with a genetic evolutionary optimization strategy and with conventional zero-forcing precoding. For performance evaluation, we run numerical ray-tracing simulations and present a room-scale VLC testbed for experimental verification.
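    The binary weighting matrix described above can be illustrated with a simple greedy max-min heuristic in the spirit of the Santa Claus problem; the sketch below is an illustrative assumption, not the paper's optimization, and uses a random channel matrix.

```python
# Greedy max-min ("Santa Claus"-style) assignment of LEDs to users as a toy
# stand-in for the binary precoding heuristic; channel gains are random.
import numpy as np

rng = np.random.default_rng(1)
n_leds, n_users = 36, 4
H = rng.random((n_users, n_leds))              # assumed channel-gain matrix
W = np.zeros((n_leds, n_users))                # binary precoding matrix
received = np.zeros(n_users)                   # accumulated desired gain per user

# Place the strongest LEDs first, each time serving the currently worst-off user.
for led in sorted(range(n_leds), key=lambda j: -H[:, j].max()):
    user = int(np.argmin(received))
    W[led, user] = 1.0
    received[user] += H[user, led]

signal = np.array([H[k] @ W[:, k] for k in range(n_users)])
total = np.array([H[k] @ W.sum(axis=1) for k in range(n_users)])
sinr = signal / (total - signal + 1e-3)        # noise power assumed to be 1e-3
print("per-user SINR (linear):", np.round(sinr, 2), "min:", round(float(sinr.min()), 2))
```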

    Deep Neural Network Equalization for Optical Short Reach Communication

    Nonlinear distortion has always been a challenge for optical communication due to the nonlinear transfer characteristics of the fiber itself. The next frontier for optical communication is a second type of nonlinearity, which results from the optical and electrical components and becomes the dominant impairment at shorter reaches. The highest data rates cannot be achieved without effective compensation. A classical countermeasure is receiver-side equalization of nonlinear impairments and memory effects using Volterra series. However, such Volterra equalizers are architecturally complex and their parametrization can be numerically unstable. This contribution proposes an alternative nonlinear equalizer architecture based on machine learning. Its performance is evaluated experimentally on coherent 88 Gbaud dual-polarization 16QAM 600 Gb/s back-to-back measurements. The proposed equalizers outperform Volterra and memory polynomial Volterra equalizers of up to 6th order at a target bit-error rate (BER) of 10⁻² by 0.5 dB and 0.8 dB in optical signal-to-noise ratio (OSNR), respectively.
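    As a rough counterpart to the proposed machine-learning equalizer, the sketch below (an assumed signal model, not the 88 Gbaud experiment) trains a small feed-forward network on a sliding window of received samples of one quadrature and makes hard decisions on the centre symbol.

```python
# Toy feed-forward equalizer on a PAM-4 quadrature with mild memory and
# nonlinearity; all parameters are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
n_sym, taps = 20000, 11                        # assumed sequence length and memory
tx = rng.choice([-3.0, -1.0, 1.0, 3.0], size=n_sym)
rx = np.convolve(tx, [0.1, 0.8, 0.1], mode="same")           # linear memory
rx = rx + 0.05 * rx**3 + 0.1 * rng.normal(size=n_sym)        # component nonlinearity + noise

# Each row of X holds `taps` consecutive received samples; the target is the centre symbol.
idx = np.arange(taps) + np.arange(n_sym - taps + 1)[:, None]
X, y = rx[idx], tx[taps // 2 : n_sym - taps // 2]

eq = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=200, random_state=0)
eq.fit(X[:15000], y[:15000])
hard = np.clip(np.round((eq.predict(X[15000:]) + 3) / 2) * 2 - 3, -3, 3)   # hard decision
print("symbol error rate:", np.mean(hard != y[15000:]))
```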

    Mitigation of Nonlinear Impairments by Using Support Vector Machine and Nonlinear Volterra Equalizer

    A support vector machine (SVM) based detection scheme is applied to different equalization schemes for a data center interconnect link using coherent 64 GBd 64-QAM over 100 km of standard single-mode fiber (SSMF). Without any prior knowledge or heuristic assumptions, the SVM is able to learn and capture the transmission characteristics from only a short training data set. We show that, with suitable kernel functions, the SVM can create nonlinear decision thresholds and reduce the errors caused by nonlinear phase noise (NLPN), laser phase noise, I/Q imbalances, and so forth. In order to apply the SVM to 64-QAM, we introduce a binary-coding SVM, which provides binary multiclass classification with reduced complexity. We investigate the performance of this SVM and show how it can improve the bit-error rate (BER) of the entire system. After 100 km, the fiber-induced nonlinear penalty is reduced by 2 dB at a BER of 3.7 × 10⁻³. Furthermore, we apply a nonlinear Volterra equalizer (NLVE), which is based on Volterra theory, as another method for mitigating nonlinear effects. The combination of SVM and NLVE reduces the large computational complexity of the NLVE and allows more accurate compensation of nonlinear transmission impairments.
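    The binary-coding idea can be sketched in a few lines: each of the six bits of a 64-QAM symbol gets its own binary RBF-kernel SVM, and the bit decisions are recombined into a symbol. The constellation mapping, noise model, and data sizes below are illustrative assumptions, not the measured link.

```python
# Hypothetical binary-coding SVM detector for 64-QAM (six binary classifiers).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)
levels = np.array([-7, -5, -3, -1, 1, 3, 5, 7])    # 8 levels per quadrature -> 64-QAM
n = 6000
sym_i, sym_q = rng.integers(0, 8, n), rng.integers(0, 8, n)
labels = sym_i * 8 + sym_q                          # 64 classes
bits = (labels[:, None] >> np.arange(6)) & 1        # binary coding: 6 bits per symbol

# Received I/Q with a toy amplitude-dependent phase rotation plus Gaussian noise.
x = levels[sym_i] + 1j * levels[sym_q]
x = x * np.exp(1j * 0.002 * np.abs(x) ** 2)
x = x + 0.4 * (rng.normal(size=n) + 1j * rng.normal(size=n))
X = np.column_stack([x.real, x.imag])

clfs = [SVC(kernel="rbf", C=10).fit(X[:5000], bits[:5000, b]) for b in range(6)]
bits_hat = np.column_stack([c.predict(X[5000:]) for c in clfs])
labels_hat = (bits_hat << np.arange(6)).sum(axis=1)
print("symbol error rate:", np.mean(labels_hat != labels[5000:]))
```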

    Optically Enabled ADCs and Application to Optical Communications

    Electrical-optical signal processing has been shown to be a promising path to overcoming the limitations of state-of-the-art all-electrical data converters. In addition to ultra-broadband signal processing, it allows leveraging ultra-low-jitter mode-locked lasers and thus increasing the aperture-jitter-limited effective number of bits at high analog signal frequencies. In this paper, we review our recent progress towards optically enabled time- and frequency-interleaved analog-to-digital converters, as well as their monolithic integration in electronic-photonic integrated circuits. For signal frequencies up to 65 GHz, an optoelectronic track-and-hold amplifier based on the source-emitter-follower architecture is shown to be a power-efficient approach in optically enabled BiCMOS technology. At higher signal frequencies, integrated photonic filters enable signal slicing in the frequency domain and further scaling of the conversion bandwidth, with the reconstruction of a 140 GHz optical signal demonstrated. We further show how such optically enabled data converter architectures can be applied to a nonlinear Fourier transform based integrated transceiver in particular, and discuss their applicability to broadband optical links in general.
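    To illustrate the interleaving principle behind such converters in the simplest possible terms, the following numpy sketch (purely conceptual, not tied to the photonic hardware above) models four time-interleaved sub-converters sampling staggered phases of one waveform and recombining their outputs into a full-rate record.

```python
# Conceptual model of time-interleaved conversion; rates, resolution, and the
# test waveform are arbitrary assumptions.
import numpy as np

fs, n_sub, n_samples = 64e9, 4, 4096               # aggregate rate, sub-ADCs, samples
t = np.arange(n_samples) / fs
analog = 0.6 * np.sin(2 * np.pi * 5e9 * t) + 0.3 * np.sin(2 * np.pi * 11e9 * t)

def quantize(v, bits=6):
    """Ideal mid-rise quantizer standing in for each sub-ADC (full scale [-1, 1])."""
    step = 2.0 / 2**bits
    return np.clip(np.round(v / step) * step, -1.0, 1.0 - step)

# Each sub-ADC sees every n_sub-th sample, i.e. a phase-shifted fs/n_sub clock.
sub_streams = [quantize(analog[k::n_sub]) for k in range(n_sub)]

# Interleave the sub-streams back into one full-rate record.
digital = np.empty(n_samples)
for k, s in enumerate(sub_streams):
    digital[k::n_sub] = s

snr = 10 * np.log10(np.mean(analog**2) / np.mean((analog - digital)**2))
print(f"reconstruction SNR: {snr:.1f} dB")
```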